Quantitative Stability Analysis for Minimax Distributionally Robust Risk Optimization
Abstract
This paper considers distributionally robust formulations of a two-stage stochastic programming problem whose objective is to minimize a distortion risk of the minimal cost incurred at the second stage. We carry out stability analysis by examining variations of the ambiguity set under the Wasserstein metric, of the decision spaces at both stages, and of the support set of the random variables. In the risk-neutral case, the stability result is presented with the variation of the ambiguity set measured by generic metrics of ζ-structure, which provides a unified framework for quantitative stability analysis under various metrics, including the total variation metric and the Kantorovich metric. When the ambiguity set is structured as a ζ-ball, we find that the Hausdorff distance between two ζ-balls is bounded by the distance between their centres plus the difference of their radii. These findings allow us to strengthen some recent convergence results on distributionally robust optimization in which the centre of the Wasserstein ball is constructed from the empirical probability distribution.
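As a numerical illustration (not taken from the paper), the sketch below computes the 1-Wasserstein distance between two empirical distributions, the quantity that measures the variation of the centres of two Wasserstein ambiguity balls, and evaluates the bound on the Hausdorff distance stated in the abstract for two hypothetical radii. The samples and radii are invented for demonstration.

```python
# Illustrative sketch: variation of Wasserstein-ball centres and the
# resulting bound on the Hausdorff distance between the two balls.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
sample_p = rng.normal(loc=0.0, scale=1.0, size=500)  # centre of ball 1 (empirical)
sample_q = rng.normal(loc=0.5, scale=1.0, size=500)  # centre of ball 2 (empirical)

r1, r2 = 0.10, 0.25  # hypothetical ball radii
d_centres = wasserstein_distance(sample_p, sample_q)

# The abstract's bound: the Hausdorff distance between the two zeta-balls
# is at most the distance of the centres plus the difference of the radii.
hausdorff_bound = d_centres + abs(r1 - r2)
print(f"centre distance = {d_centres:.3f}, Hausdorff bound = {hausdorff_bound:.3f}")
```

Here the empirical samples play the role of the centres of the Wasserstein balls; as the sample size grows, the centre distance converges to the 1-Wasserstein distance between the underlying distributions.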
Similar papers
Quantitative Stability Analysis for Distributionally Robust Optimization with Moment Constraints
In this paper we consider a broad class of distributionally robust optimization (DRO for short) problems where the probability distribution of the underlying random variables depends on the decision variables and the ambiguity set is defined through parametric moment conditions with generic cone constraints. Under some moderate conditions, including Slater-type conditions for the cone-constrained moment system, an...
Stochastic Gradient Methods for Distributionally Robust Optimization with f-divergences
We develop efficient solution methods for a robust empirical risk minimization problem designed to give calibrated confidence intervals on performance and provide optimal tradeoffs between bias and variance. Our methods apply to distributionally robust optimization problems proposed by Ben-Tal et al., which put more weight on observations inducing high loss via a worst-case approach over a non-...
Minimax Statistical Learning and Domain Adaptation with Wasserstein Distances
As opposed to standard empirical risk minimization (ERM), distributionally robust optimization aims to minimize the worst-case risk over a larger ambiguity set containing the original empirical distribution of the training data. In this work, we describe a minimax framework for statistical learning with ambiguity sets given by balls in Wasserstein space. In particular, we prove a generalization...
Distributionally Robust Optimization with Matrix Moment Constraints: Lagrange Duality and Cutting Plane Methods
A key step in solving minimax distributionally robust optimization (DRO) problems is to reformulate the inner maximization w.r.t. the probability measure as a semi-infinite programming problem through the Lagrange dual. Slater-type conditions have been widely used to ensure a zero dual gap when the ambiguity set is defined through moments. In this paper, we investigate effective ways of verifying the Slater ty...
Kullback-Leibler Divergence Constrained Distributionally Robust Optimization
In this paper we study distributionally robust optimization (DRO) problems where the ambiguity set of the probability distribution is defined by the Kullback-Leibler (KL) divergence. We consider DRO problems where the ambiguity is in the objective function, which takes the form of an expectation, and show that the resulting minimax DRO problems can be formulated as a one-layer convex minimization ...
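For concreteness, the KL divergence that defines such an ambiguity set can be evaluated directly for discrete distributions. The sketch below (with made-up distributions, not from the paper) computes D(P‖Q) for a nominal distribution P and a candidate distribution Q that would belong to the KL ball whenever the divergence is below the chosen radius.

```python
# Hedged illustration: KL divergence between two discrete distributions,
# the quantity that defines the ambiguity set in KL-constrained DRO.
import numpy as np
from scipy.stats import entropy

p = np.array([0.5, 0.3, 0.2])  # nominal (e.g. empirical) distribution
q = np.array([0.4, 0.4, 0.2])  # candidate distribution

kl = entropy(p, q)  # sum_i p_i * log(p_i / q_i)  ≈ 0.0253
print(f"D(P||Q) = {kl:.4f}")
```

A candidate Q lies in the ambiguity ball of radius rho exactly when this value is at most rho; the divergence is zero only when P and Q coincide.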